Direct Maximization of Average Precision by Hill-Climbing, with a Comparison to a Maximum Entropy Approach

Authors

  • William T. Morgan
  • Warren R. Greiff
  • John C. Henderson
Abstract

We describe an algorithm for choosing term weights to maximize average precision. The algorithm performs successive exhaustive searches through single directions in weight space. It makes use of a novel technique for considering all possible values of average precision that arise in searching for a maximum in a given direction. We apply the algorithm and compare it to a maximum entropy approach.
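The coordinate-wise search the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it scans a fixed grid of candidate values for one weight at a time and keeps the value that maximizes average precision, whereas the paper's technique enumerates all distinct average precision values along a direction exactly.

```python
import numpy as np

def average_precision(scores, labels):
    """Mean of the precision values at the rank of each relevant document."""
    order = np.argsort(-scores)            # rank documents by descending score
    rel = np.asarray(labels)[order]
    hits = np.cumsum(rel)
    ranks = np.arange(1, len(rel) + 1)
    if rel.sum() == 0:
        return 0.0
    return (hits[rel == 1] / ranks[rel == 1]).mean()

def hill_climb(features, labels, n_passes=5, grid=np.linspace(-2, 2, 41)):
    """Coordinate ascent on average precision over term weights.

    features: (n_docs, n_terms) matrix; a document's score is features @ w.
    Each pass sweeps the coordinates; along each coordinate a grid of
    candidate weights is tried and the best-scoring one is kept, so the
    objective never decreases."""
    w = np.ones(features.shape[1])
    for _ in range(n_passes):
        for j in range(len(w)):
            best_val = w[j]
            best_ap = average_precision(features @ w, labels)
            for v in grid:
                w[j] = v
                ap = average_precision(features @ w, labels)
                if ap > best_ap:
                    best_val, best_ap = v, ap
            w[j] = best_val
    return w
```

Because only improving moves are accepted, average precision under the returned weights is at least as high as under the starting weights; the grid resolution trades search cost against how finely each direction is explored.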


Related articles

Comparison of Genetic and Hill Climbing Algorithms to Improve an Artificial Neural Networks Model for Water Consumption Prediction

No unique method has so far been specified for determining the number of neurons in the hidden layers of Multi-Layer Perceptron (MLP) neural networks used for prediction. The present research is intended to optimize the number of neurons using two meta-heuristic procedures, namely genetic and hill climbing algorithms. The data used in the present research for prediction are consumption data of water...


Variational Methods for Stochastic Optimization

In the study of graphical models, methods based on the concept of variational free-energy bounds have been widely used for approximating functionals of probability distributions. In this paper, we provide a method based on the same principles that can be applied to problems of stochastic optimization. In particular, this method is based upon the same principles as the generalized EM algorithm. W...


DENCLUE 2.0: Fast Clustering Based on Kernel Density Estimation

The Denclue algorithm employs a cluster model based on kernel density estimation. A cluster is defined by a local maximum of the estimated density function. Data points are assigned to clusters by hill climbing, i.e. points going to the same local maximum are put into the same cluster. A disadvantage of Denclue 1.0 is that the hill climbing it uses may take unnecessarily small steps in the beginnin...


The Expectation-Maximization and Alternating Minimization Algorithms

The Expectation-Maximization (EM) algorithm is a hill-climbing approach to finding a local maximum of a likelihood function [7, 8]. The EM algorithm alternates between finding a greatest lower bound to the likelihood function (the “E Step”), and then maximizing this bound (the “M Step”). The EM algorithm belongs to a broader class of alternating minimization algorithms [6], which includes the A...
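The E-step/M-step alternation described in this snippet can be illustrated with a minimal EM fit of a two-component one-dimensional Gaussian mixture (an illustrative sketch, not code from the cited work):

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture.

    E step: responsibilities (posterior component probabilities) define a
    lower bound on the log-likelihood; M step maximizes that bound by
    re-estimating means, variances, and mixing weights."""
    x = np.asarray(x, float)
    mu = np.array([x.min(), x.max()])            # crude initialization
    var = np.full(2, x.var()) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E step: responsibility r[i, k] of component k for point i
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
                  / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M step: weighted maximum-likelihood updates
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        pi = nk / len(x)
    return mu, var, pi
```

Each iteration is a hill-climbing step on the likelihood: the E step tightens the lower bound at the current parameters, and the M step moves to the parameters that maximize it, so the likelihood never decreases.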


Noise-enhanced convolutional neural networks

Injecting carefully chosen noise can speed convergence in the backpropagation training of a convolutional neural network (CNN). The Noisy CNN algorithm speeds training on average because the backpropagation algorithm is a special case of the generalized expectation-maximization (EM) algorithm and because such carefully chosen noise always speeds up the EM algorithm on average. The CNN framework...



Journal:

Volume:   Issue:

Pages:  -

Publication date: 2004